Concentration of Posterior Model Probabilities and Normalized L0 Criteria

Authors

Abstract

We study frequentist properties of Bayesian and $L_0$ model selection, with a focus on (potentially non-linear) high-dimensional regression. We propose a construction to study how posterior probabilities and normalized $L_0$ criteria concentrate on the (Kullback-Leibler) optimal model and other subsets of the model space. When such concentration occurs, one also bounds the probability of selecting the correct model, and type I and type II errors. These results hold generally, and help validate the use of these criteria to control the error associated with model selection and hypothesis tests. Regarding regression, we help understand the effect of the sparsity imposed by the prior or penalty, and of problem characteristics such as the sample size, signal-to-noise ratio, dimension, and true sparsity. A particular finding is that one may use less sparse formulations than would be asymptotically optimal, yet still attain consistency and often significantly better finite-sample performance. We also prove new results related to misspecifying the mean or covariance structures, and give tighter rates for certain non-local priors than are currently available.
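
To make these quantities concrete, the sketch below computes exact posterior model probabilities for all-subsets Gaussian linear regression under Zellner's g-prior with $g = n$, using the closed-form Bayes factor of Liang et al. (2008). It only illustrates how posterior mass concentrates on the data-generating model; it is not the paper's construction, and the choices of $n$, $p$, and the true coefficients are arbitrary.

```python
# Illustrative sketch (not the paper's construction): exact posterior model
# probabilities for Gaussian linear regression under Zellner's g-prior, g = n.
# Bayes factor of model gamma vs. the intercept-only null (Liang et al., 2008):
#   BF = (1+g)^((n-1-p_gamma)/2) * (1 + g*(1 - R^2_gamma))^(-(n-1)/2)
import itertools

import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 6
X = rng.standard_normal((n, p))
beta = np.array([1.0, -1.0, 0.0, 0.0, 0.0, 0.0])  # true model: {0, 1}
y = X @ beta + rng.standard_normal(n)

def r_squared(X_gamma, y):
    """R^2 of the OLS fit of y on X_gamma plus an intercept."""
    Z = np.column_stack([np.ones(len(y)), X_gamma])
    resid = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)

g = float(n)
log_bf = {}  # log Bayes factor of each model against the null model
for k in range(p + 1):
    for gamma in itertools.combinations(range(p), k):
        r2 = r_squared(X[:, list(gamma)], y) if gamma else 0.0
        log_bf[gamma] = (0.5 * (n - 1 - k) * np.log1p(g)
                         - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2)))

# Uniform prior over the 2^p models; normalize to posterior probabilities.
models, logs = list(log_bf), np.array(list(log_bf.values()))
post = np.exp(logs - logs.max())
post /= post.sum()
for i in np.argsort(post)[::-1][:3]:  # three models with highest probability
    print(models[i], round(post[i], 3))
```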

Similar Articles

Normalized Tenacity and Normalized Toughness of Graphs

In this paper, we introduce two novel parameters, Normalized Tenacity ($T_N$) and Normalized Toughness ($t_N$), obtained by modifying the existing Tenacity and Toughness parameters. These new parameters enable graphs of different orders to be compared with each other with regard to their vulnerability. The parameters are also reviewed and discussed for some special graphs.
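
The abstract does not spell out the normalization itself. As a rough illustration, the brute-force sketch below computes the classical Toughness $t(G)=\min_S |S|/c(G-S)$ and Tenacity $T(G)=\min_S (|S|+m(G-S))/c(G-S)$ over all vertex cuts $S$ (with $c(G-S)$ the number of components of $G-S$ and $m(G-S)$ the order of a largest one), then divides by the order $n$ as an assumed placeholder for $t_N$ and $T_N$.

```python
# Brute-force classical Tenacity and Toughness; the division by the order n
# at the end is only an assumed placeholder for the normalized variants,
# since the abstract does not give the exact normalization. Exponential cost:
# meant for small, non-complete graphs only.
import itertools

import networkx as nx

def cut_stats(G):
    """Yield (|S|, #components, largest-component order) for every vertex
    set S whose removal disconnects G."""
    nodes = list(G.nodes)
    for k in range(1, len(nodes) - 1):
        for S in itertools.combinations(nodes, k):
            H = G.subgraph(v for v in nodes if v not in S)
            comps = list(nx.connected_components(H))
            if len(comps) > 1:
                yield k, len(comps), max(len(c) for c in comps)

def toughness(G):
    return min(s / c for s, c, _ in cut_stats(G))

def tenacity(G):
    return min((s + m) / c for s, c, m in cut_stats(G))

G = nx.cycle_graph(6)
n = G.number_of_nodes()
print("t =", toughness(G), "T =", tenacity(G))
print("t_N =", toughness(G) / n, "T_N =", tenacity(G) / n)  # assumed scaling
```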

Assessment of two approximation methods for computing posterior model probabilities

Model selection is an important problem in statistical applications. Bayesian model averaging provides an alternative to classical model selection procedures and allows researchers to consider several models from which to draw inferences. In the multiple linear regression case, it is difficult to compute the exact posterior model probabilities required for Bayesian model averaging. To reduce the comp...
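
The truncated abstract does not name the two approximations assessed; one standard example is the BIC approximation $p(M_k \mid y) \approx \exp(-\mathrm{BIC}_k/2) / \sum_j \exp(-\mathrm{BIC}_j/2)$, sketched below for all-subsets Gaussian linear regression (the data-generating choices are arbitrary).

```python
# One common approximation to posterior model probabilities (not necessarily
# one of the two assessed in the paper): BIC weights over all subsets.
import itertools

import numpy as np

def bic(X_gamma, y):
    """BIC of the Gaussian OLS fit of y on X_gamma plus an intercept."""
    n = len(y)
    Z = (np.column_stack([np.ones(n), X_gamma]) if X_gamma.size
         else np.ones((n, 1)))
    resid = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return n * np.log(resid @ resid / n) + Z.shape[1] * np.log(n)

def approx_model_probs(X, y):
    p = X.shape[1]
    models = [m for k in range(p + 1)
              for m in itertools.combinations(range(p), k)]
    b = np.array([bic(X[:, list(m)], y) for m in models])
    w = np.exp(-(b - b.min()) / 2.0)  # subtract min for numerical stability
    return models, w / w.sum()

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 4))
y = X[:, 0] - X[:, 1] + rng.standard_normal(100)
models, probs = approx_model_probs(X, y)
print(models[int(np.argmax(probs))], round(float(probs.max()), 3))
```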

Bayesian posterior probabilities: revisited

Huelsenbeck and Rannala (2004, Systematic Biology 53, 904–913) presented a series of simulations in order to assess the extent to which the Bayesian posterior probabilities associated with phylogenetic trees represent the standard frequentist statistical interpretation. They concluded that when the analysis model matches the generating model then the Bayesian posterior probabilities are correct...

Classifier Conditional Posterior Probabilities

Classifiers based on probability density estimates can be used to find posterior probabilities for the objects to be classified. These probabilities can be used for rejection or for combining classifiers. Posterior probabilities for other classifiers, however, have to be conditional on the classifier, i.e. they yield class probabilities for a given value of the classifier outcome instead of for ...
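
A minimal sketch of the density-estimate route described above, assuming one-dimensional features and Gaussian kernel density estimates per class: Bayes' rule turns class-conditional densities $f_k$ and priors $\pi_k$ into posteriors $p(k \mid x) = \pi_k f_k(x) / \sum_j \pi_j f_j(x)$, which can then drive a reject option or classifier combining.

```python
# Posterior probabilities from class-conditional density estimates via
# Bayes' rule; the data, priors, and 0.8 rejection threshold are arbitrary.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
train = {0: rng.normal(-1.0, 1.0, 300), 1: rng.normal(1.5, 0.8, 300)}
priors = {k: 0.5 for k in train}
kdes = {k: gaussian_kde(x) for k, x in train.items()}

def posterior(x):
    """p(k | x) for each class k, via Bayes' rule on the density estimates."""
    joint = np.array([priors[k] * kdes[k](x)[0] for k in sorted(kdes)])
    return joint / joint.sum()

p = posterior(0.2)
print(p, "reject" if p.max() < 0.8 else f"class {p.argmax()}")
```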

Anisotropic Smoothing of Posterior Probabilities

Recently, we proposed an efficient image segmentation technique that anisotropically smooths the homogeneous posterior probabilities before independent pixelwise MAP classification is carried out [11]. In this paper, we develop the mathematical theory underlying the technique. We demonstrate that prior anisotropic smoothing of the posterior probabilities yields the MAP solution of a discrete MRF...
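
As a rough stand-in for the anisotropic diffusion used in that work, the sketch below smooths each class's posterior probability map with a plain Gaussian filter and then takes the pixelwise argmax (MAP) label; the image size, number of classes, and bandwidth are arbitrary.

```python
# Smooth per-class posterior probability maps, then take pixelwise MAP labels.
# Gaussian smoothing here is only a stand-in for the paper's anisotropic
# diffusion scheme.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)
H, W, K = 64, 64, 3
probs = rng.dirichlet(np.ones(K), size=(H, W))  # noisy per-pixel posteriors
smoothed = np.stack([gaussian_filter(probs[..., k], sigma=2.0)
                     for k in range(K)], axis=-1)
smoothed /= smoothed.sum(axis=-1, keepdims=True)  # renormalize per pixel
labels = smoothed.argmax(axis=-1)                 # pixelwise MAP labels
print(labels.shape, np.bincount(labels.ravel(), minlength=K))
```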

Journal

Journal title: Bayesian Analysis

Year: 2022

ISSN: 1936-0975, 1931-6690

DOI: https://doi.org/10.1214/21-ba1262